Corruption of Generalizing Signals in Densely Connected Feedforward Neural Networks with Hyperbolic Tangent Activation Functions
Author
Abstract
This paper discusses the propagation of signals in generic densely connected multilayered feedforward neural networks. It is concluded that the dense connectivity, combined with the hyperbolic tangent activation functions of the neurons, may cause highly random, spurious generalization that decreases the overall performance and reliability of a neural network and can be mistaken for overfitting. Modified activation functions in place of the hyperbolic tangent, and an organized rather than ad hoc way of connecting neurons, are discussed as possible ways of reducing the spurious generalization.

Keywords: feedforward neural networks, generalization, overfitting
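As an illustration of the mechanism described in the abstract, the following minimal Python sketch propagates a signal through a densely connected feedforward network whose hidden units all use the hyperbolic tangent. The layer sizes, weight scale, and random initialization are arbitrary choices made for the example and are not taken from the paper; with wide layers and a large weight scale, the summed input to each unit grows and drives tanh into its saturated region, where responses become nearly sign-like and only loosely tied to the input.

    import numpy as np

    rng = np.random.default_rng(0)

    def dense_tanh_forward(x, layer_sizes, weight_scale=1.0):
        # Propagate x through a densely connected feedforward network in
        # which every hidden unit uses the hyperbolic tangent activation.
        # layer_sizes and weight_scale are illustrative, not from the paper.
        a = x
        for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
            W = rng.normal(0.0, weight_scale / np.sqrt(n_in), size=(n_out, n_in))
            b = np.zeros(n_out)
            a = np.tanh(W @ a + b)
        return a

    # With wide, densely connected layers the summed inputs to each unit grow,
    # pushing tanh toward its +/-1 saturation region, where distinct inputs
    # produce nearly identical, sign-like responses.
    x = rng.normal(size=64)
    print(dense_tanh_forward(x, [64, 256, 256, 10], weight_scale=4.0))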
Similar resources
Implementation of a programmable neuron in CNTFET technology for low-power neural networks
This article discusses the circuit-level implementation of a novel neuron. A low-power Activation Function (AF) circuit is introduced and combined with a highly linear synapse circuit to form the neuron architecture. Designed in Carbon Nanotube Field-Effect Transistor (CNTFET) technology, the proposed structure consumes little power, which makes it suitable for the...
Uniqueness of network parametrization and faster learning
Any single-hidden-layer feedforward network based on Gaussian or asymptotically constant odd or even rational non-polynomial activation functions has the same property as such networks based on the hyperbolic tangent: the input-output function determines the weights and biases up to a permutation of the hidden units and sign flips.
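The symmetry referred to here is easy to check numerically. The sketch below builds an arbitrary single-hidden-layer tanh network (the sizes and random weights are placeholders, not values from the cited work) and verifies that flipping the signs of one hidden unit's incoming weights, bias, and outgoing weight, or permuting the hidden units, leaves the input-output function unchanged; these are exactly the ambiguities the result allows.

    import numpy as np

    rng = np.random.default_rng(1)

    # Single hidden layer: y = v @ tanh(W @ x + b); sizes are arbitrary.
    n_in, n_hidden = 3, 5
    W = rng.normal(size=(n_hidden, n_in))
    b = rng.normal(size=n_hidden)
    v = rng.normal(size=n_hidden)

    def net(x, W, b, v):
        return v @ np.tanh(W @ x + b)

    x = rng.normal(size=n_in)

    # Sign flip of one hidden unit: tanh is odd, so negating its incoming
    # weights, bias, and outgoing weight leaves the output unchanged.
    k = 2
    W2, b2, v2 = W.copy(), b.copy(), v.copy()
    W2[k], b2[k], v2[k] = -W2[k], -b2[k], -v2[k]

    # Permuting (relabelling) the hidden units also changes nothing.
    perm = rng.permutation(n_hidden)
    W3, b3, v3 = W[perm], b[perm], v[perm]

    print(np.isclose(net(x, W, b, v), net(x, W2, b2, v2)))  # True
    print(np.isclose(net(x, W, b, v), net(x, W3, b3, v3)))  # True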
Fully Complex Backpropagation for Constant Envelope Signal Processing
One of the challenges in designing a neural network to process complex-valued signals is finding a suitable nonlinear complex activation function. The main reason for this difficulty is the conflict between the boundedness and the differentiability of complex functions on the entire complex plane, as stated by Liouville's theorem. To avoid this difficulty, 'splitting', i.e., using two separate real...
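The 'splitting' workaround mentioned above can be written down in a few lines. In the sketch below (the name split_tanh is a label chosen for this example, not from the paper), a bounded real nonlinearity is applied separately to the real and imaginary parts of the signal; the result stays bounded even though, by Liouville's theorem, no non-constant function that is analytic on the whole complex plane can be bounded.

    import numpy as np

    def split_tanh(z):
        # 'Split' activation: the bounded real tanh is applied to the real
        # and imaginary parts separately, so the output magnitude never
        # exceeds sqrt(2), at the cost of not being complex-differentiable.
        return np.tanh(z.real) + 1j * np.tanh(z.imag)

    rng = np.random.default_rng(2)
    z = rng.normal(size=4) + 1j * rng.normal(size=4)
    print(split_tanh(z))
    print(np.abs(split_tanh(z)) < np.sqrt(2))  # bounded output, all True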
Implementation of Knowledge-Based Neural Network with Hyperbolic Tangent Function
Neural networks have a wide range of applications in analog and digital signal processing. The nonlinear activation function is one of the main building blocks of artificial neural networks, and the hyperbolic tangent and sigmoid are the most widely used nonlinear activation functions. This project proposes a knowledge-based neural network (KBNN) modeling approach with a new hyperbolic tangent function. The KB...
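Since the passage singles out the hyperbolic tangent and the sigmoid as the standard nonlinearities, it may help to recall that the two are tied by a simple identity, tanh(x) = 2*sigmoid(2x) - 1, so each is an affine rescaling of the other. The short check below only illustrates that identity; it is not the knowledge-based modification proposed in the cited work.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # tanh and sigmoid differ only by rescaling and shifting:
    # tanh(x) = 2 * sigmoid(2x) - 1.
    x = np.linspace(-4.0, 4.0, 9)
    print(np.allclose(np.tanh(x), 2.0 * sigmoid(2.0 * x) - 1.0))  # True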
Effects of Non-uniform Suction, Heat Generation/Absorption and Chemical Reaction with Activation Energy on MHD Falkner-Skan Flow of Tangent Hyperbolic Nanofluid over a Stretching/Shrinking Wedge
In the present investigation, the magnetohydrodynamic Falkner-Skan flow of a tangent hyperbolic nanofluid over a stretching/shrinking wedge with variable suction, internal heat generation/absorption, and chemical reaction with activation energy has been scrutinized. The nanofluid model accounts for Brownian motion and thermophoresis. Transformed non-dimensional coupled non-linear equations a...
Journal title:
Volume, Issue:
Pages: -
Publication date: 2005